Sparse sign-consistent Johnson-Lindenstrauss matrices: compression with neuroscience-based constraints.

Authors

  • Zeyuan Allen-Zhu
  • Rati Gelashvili
  • Silvio Micali
  • Nir Shavit
Abstract

Johnson-Lindenstrauss (JL) matrices implemented by sparse random synaptic connections are thought to be a prime candidate for how convergent pathways in the brain compress information. However, to date, there is no complete mathematical support for such implementations given the constraints of real neural tissue. The fact that neurons are either excitatory or inhibitory implies that every JL matrix implementable in this way must be sign consistent (i.e., all entries in a single column must be either all nonnegative or all nonpositive), and the fact that any given neuron connects to a relatively small subset of other neurons implies that the JL matrix should be sparse. We construct sparse JL matrices that are sign consistent and prove that our construction is essentially optimal. Our work answers a mathematical question that was triggered by earlier work, is necessary to justify the existence of JL compression in the brain, and emphasizes that inhibition is crucial if neurons are to perform efficient, correlation-preserving compression.
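The two constraints in the abstract — sign consistency (one shared sign per column) and sparsity (few nonzeros per column) — can be illustrated with a minimal sketch. This is not the paper's exact construction or parameter choice; it only shows the flavor of a sparse, sign-consistent random projection, with `s` nonzeros per column scaled by 1/√s:

```python
import numpy as np

def sparse_sign_consistent_jl(n, m, s, rng=None):
    """Illustrative sparse, sign-consistent JL-style matrix.

    Each column has exactly s nonzero entries of value sigma_j / sqrt(s),
    where sigma_j is a single random sign shared by the whole column
    (sign consistency: an "excitatory" or "inhibitory" column), and the
    s nonzero rows are chosen uniformly at random (sparsity).
    """
    rng = np.random.default_rng(rng)
    Phi = np.zeros((m, n))
    for j in range(n):
        rows = rng.choice(m, size=s, replace=False)  # s random output neurons
        sigma = rng.choice([-1.0, 1.0])              # one sign for the column
        Phi[rows, j] = sigma / np.sqrt(s)
    return Phi

# Project a vector and compare Euclidean norms before and after.
Phi = sparse_sign_consistent_jl(n=1000, m=200, s=8, rng=0)
x = np.random.default_rng(1).standard_normal(1000)
print(np.linalg.norm(Phi @ x) / np.linalg.norm(x))
```

The norm ratio printed at the end will concentrate near 1 for typical inputs; the paper's contribution is proving how small `m` and `s` can be made while guaranteeing this for all point pairs.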


Related articles

The Johnson-Lindenstrauss Lemma Meets Compressed Sensing

We show how two fundamental results in analysis related to n-widths and Compressed Sensing are intimately related to the Johnson-Lindenstrauss lemma. Our elementary approach is based on the same concentration inequalities for random inner products that have recently provided simple proofs of the Johnson-Lindenstrauss lemma. We show how these ideas lead to simple proofs of Kashin’s theorems on w...


Dimensionality reduction with subgaussian matrices: a unified theory

We present a theory for Euclidean dimensionality reduction with subgaussian matrices which unifies several restricted isometry property and Johnson-Lindenstrauss type results obtained earlier for specific data sets. In particular, we recover and, in several cases, improve results for sets of sparse and structured sparse vectors, low-rank matrices and tensors, and smooth manifolds. In addition, ...


Simple Analysis of Sparse, Sign-Consistent JL

Allen-Zhu, Gelashvili, Micali, and Shavit constructed a sparse, sign-consistent Johnson-Lindenstrauss distribution, and proved that this distribution yields an essentially optimal dimension for the correct choice of sparsity. However, their analysis of the upper bound on the dimension and sparsity required a complicated combinatorial graph-based argument similar to Kane and Nelson’s analysis of...


Toward a Unified Theory of Sparse Dimensionality Reduction in Euclidean Space (Jean Bourgain and Jelani Nelson)

Let Φ ∈ R^(m×n) be a sparse Johnson-Lindenstrauss transform [KN] with s non-zeroes per column. For T a subset of the unit sphere and ε ∈ (0, 1/2) given, we study settings for m, s required to ensure E_Φ sup_{x∈T} |‖Φx‖₂² − 1| < ε, i.e. so that Φ preserves the norm of every x ∈ T simultaneously and multiplicatively up to 1 + ε. In particular, our most general theorem shows that it suffices to set m = ...


New and Improved Johnson-Lindenstrauss Embeddings via the Restricted Isometry Property

Consider an m×N matrix Φ with the Restricted Isometry Property of order k and level δ, that is, the norm of any k-sparse vector in R^N is preserved to within a multiplicative factor of 1±δ under application of Φ. We show that by randomizing the column signs of such a matrix Φ, the resulting map with high probability embeds any fixed set of p = O(e^k) points in R^N into R^m without distorting the norm of...
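The column-sign randomization this blurb describes is a simple operation: multiply each column of an RIP matrix by an independent random ±1. A minimal sketch (the name `randomize_column_signs` is illustrative, not from the paper, and the Gaussian Φ below merely stands in for a matrix assumed to satisfy RIP):

```python
import numpy as np

def randomize_column_signs(Phi, rng=None):
    """Multiply each column of Phi by an independent random sign.

    This is the randomization step described above: Phi is assumed to
    satisfy an RIP-type condition; the sign flips turn it into a
    JL-style embedding for a fixed point set.
    """
    rng = np.random.default_rng(rng)
    signs = rng.choice([-1.0, 1.0], size=Phi.shape[1])
    return Phi * signs  # broadcasts one sign across each column

# Stand-in RIP matrix: scaled Gaussian, 20 x 100.
Phi = np.random.default_rng(0).standard_normal((20, 100)) / np.sqrt(20)
Psi = randomize_column_signs(Phi, rng=1)
```

Note that sign randomization across whole columns preserves column sparsity patterns, which is why it composes naturally with the sparse constructions discussed elsewhere on this page.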



Journal:
  • Proceedings of the National Academy of Sciences of the United States of America

Volume 111, Issue 47

Pages: –

Publication year: 2014